Web Survey Bibliography
Title Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment
Author Hoehne, J. K.; Schlosser, S.; Krebs, D.
Year 2016
Access date 03.08.2016
Full text PDF (1.01 MB)
Abstract
Relevance and Research Question: Measuring attitudes, opinions, and behaviors with agree/disagree questions is a common method in empirical social research; this question type is frequently used, for instance, in the German General Social Survey, the Eurobarometer, and the ISSP. Fowler (1995), however, suggests that agree/disagree questions require effortful and intricate cognitive information processing, and he argues for item-specific questions because they appear to be less burdensome. So far, this assumption lacks empirical evidence.
Methods and Data: In the current study, we examine the cognitive burden of agree/disagree and item-specific questions in web surveys using paradata. Measuring response times makes it possible to compare the cognitive burden of different question types and provides insights into cognitive response processes. We used an innovative double-stage outlier correction: a first stage based on respondents' activity on the survey page while answering, followed by an outlier definition based on the distribution of the response times (a sketch of this procedure follows the abstract). Additionally, we captured computer mouse clicks to help evaluate the response times. We conducted a two-group experiment based on an onomastic sampling approach: the first experimental group (n = 533) received eight agree/disagree questions on achievement motivation; the second group (n = 472) received eight comparable item-specific questions on achievement motivation.
Results: Our findings suggest that item-specific questions show, on average, significantly higher response times than their agree/disagree counterparts. The computer mouse clicks, however, show no significant differences between the two experimental groups; the question types therefore do not seem to affect clicking behavior systematically.
Added Value: Altogether, it appears that item-specific questions, contrary to the current state of research, require deeper cognitive information processing than agree/disagree questions.
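To make the double-stage outlier correction described in the Methods paragraph concrete, here is a minimal Python sketch. The field names, the page-activity flag, and the cutoff of mean ± 2 SD on log response times are illustrative assumptions; the abstract does not specify the authors' actual criteria.

import math
import statistics

# Hypothetical per-respondent paradata: response time in seconds and a flag
# indicating whether the survey page stayed active while answering (e.g., no
# switching away from the browser tab). Both fields are assumptions.
records = [
    {"id": 1, "rt": 12.4, "page_active": True},
    {"id": 2, "rt": 95.0, "page_active": False},  # interrupted: left the page
    {"id": 3, "rt": 8.1, "page_active": True},
    {"id": 4, "rt": 240.3, "page_active": True},  # distributional outlier
    {"id": 5, "rt": 10.9, "page_active": True},
    {"id": 6, "rt": 9.5, "page_active": True},
    {"id": 7, "rt": 11.2, "page_active": True},
    {"id": 8, "rt": 13.0, "page_active": True},
]

# Stage 1: drop cases whose page activity suggests interrupted processing,
# so their response times do not reflect continuous answering.
active = [r for r in records if r["page_active"]]

# Stage 2: define outliers from the distribution of the remaining response
# times; here, log-transformed times beyond mean +/- 2 SD (cutoff assumed).
log_rts = [math.log(r["rt"]) for r in active]
mean, sd = statistics.mean(log_rts), statistics.stdev(log_rts)
cleaned = [r for r in active if abs(math.log(r["rt"]) - mean) <= 2 * sd]

print([r["id"] for r in cleaned])  # record 2 fails stage 1, record 4 stage 2

Log-transforming before trimming is one common way to handle the right skew typical of response-time distributions; the authors' actual outlier definition may differ.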
Access/Direct link Conference Homepage (abstract)
Year of publication 2016
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (4086)
- Question order sensitivity of subjective well-being measures: focus on life satisfaction, self-rated...; 2016; Lee, S.; McClain, C.; Webster, N.; Han, S.
- Are Final Comments in Web Survey Panels Associated with Next-Wave Attrition?; 2016; McLauchlan, C.; Schonlau, M.
- Estimation and Adjustment of Self-Selection Bias in Volunteer Panel Web Surveys; 2016; Niu, Ch.
- Facebook, Twitter, & Qr codes: An exploratory trial examining the feasibility of social media mechanisms...; 2016; Gu, L. L.; Skierkowski, D.; Florin, P.; Friend, K.; Ye, Y.
- Sensitive Questions in Online Surveys: An Experimental Evaluation of Different Implementations of the...; 2016; Hoglinger, M.; Jann, B.; Diekmann, A.
- Design and test of a web-survey for collecting observer’s ratings on dairy goats’ behavioural...; 2016; Vieira, A.; Oliveira, M. D.; Nunes, T.; Stilwell, G.
- Análisis de herramientas gratuitas para el diseño de cuestionarios on-line [Analysis of free tools for designing online questionnaires]; 2016; Montoya, L. S.; Farran, C. X.; Catala, C. M.
- Participation in an Intensive Longitudinal Study with Weekly Web Surveys Over 2.5 Years; 2016; Barber, J. S.; Kusunoki, Y.; Gatny, H. H.; Schulz, P.
- Helping respondents provide good answers in Web surveys; 2016; Couper, M. P.; Zhang, C.
- Geht’s auch mit der Maus? – Eine Methodenstudie zu Online-Befragungen in der Jugendforschung [Does it work with the mouse, too? A methodological study of online surveys in youth research]...; 2016; Heim, R.; Konowalczyk, S.; Grgic, M.; Seyda, M.; Burrmann, U.; Rauschenbach, T.
- Shorter Interviews, Longer Surveys: Optimising the survey participant experience whilst accommodating...; 2016; Halder, A.; Bansal, H. S.; Knowles, R.; Eldridge, J.; Murray, Mi.
- Gamifying. Not all fun and games; 2016; Stubington, P.; Crichton, C.
- Are interviews costing £0.08 a waste of money? Reviewing Google Surveys for Wisdom of the Crowd...; 2016; Roughton, G.; MacKay, I.
- Observations from Twelve Years of an Annual Market Research Technology Survey; 2016; Macer, T.; Wilson, S.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T.; Wilson, S.
- Last Year Your Answer Was … The Impact of Dependent Interviewing Wording and Survey Factors on...; 2016; Al Baghal, T.
- The Effects of a Delayed Incentive on Response Rates, Response Mode, Data Quality, and Sample Bias in...; 2016; McGonagle, K.; Freedman, V. A.
- Can Student Populations in Developing Countries Be Reached by Online Surveys? The Case of the National...; 2016; Langer, A.; Meuleman, B.; Oshodi, A.-G. T.; Schroyens, M.
- The Effects of Vignette Placement on Attitudes Toward Supporting Family Members; 2016; Lau, C. Q.; Seltzer, J. A.; Bianchi, S. M.
- Comparisons of Online Recruitment Strategies for Convenience Samples: Craigslist, Google AdWords, Facebook...; 2016; Antoun, C.; Zhang, C.; Conrad, F. G.; Schober, M. F.
- Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?; 2016; Meitinger, K.; Behr, D.
- A new model for concept evaluation; 2016; Allen, D. R.
- Feature phones no barrier to conducting an effective conjoint study; 2016; de Rooij, R.; Dossin, R.
- A look at the unique data-gathering process behind the Harvard Impact Study; 2016; Vitale, J.
- Are sliders too slick for surveys?; 2016; Buskirk, T. D.
- Research gamification for quality pharmaceutical stakeholder insights; 2016; Mondry, B.; Fink, L.
- The impact of survey duration on completion rates among Millennial respondents; 2016; Coates, D.; Bliss, M.; Vivar, X.
- SurveyTester from Knowledge Navigators; 2016; Macer, T.
- Marrying passive and custom data for effective mobile targeting; 2016; King, K.; Stevens, N.
- Simplifying your mobile solution; 2016; Berry, K.
- How to maximize survey response rates; 2016; DeVall, R.; Colby, C.
- Participation rates of childhood cancer survivors to self-administered questionnaires: a systematic...; 2016; Kilsdonk, E.; Wendel, E.; van Dulmen-den Broeder, E.; van Leeuwen, F.E.; Van Den Berg, M. H.; Jaspers...
- Google's MIDAS Touch: Predicting UK Unemployment with Internet Search Data; 2016; Smith, Pau.
- Patient preference: a comparison of electronic patient-completed questionnaires with paper among cancer...; 2016; Martin, P.; Brown, M.C.; Espin‐Garcia, O.; Cuffe, S.; Pringle, D.; Mahler, M.; Villeneuve, J.;...
- Mixed Mode Research: Issues in Design and Analysis; 2016; Hox, J.; De Leeuw, E. D.; Klausch, L. T.
- Does the Use of Smartphones to Participate in Web Surveys Affect the Survey Experience when Sensitive...; 2016; Toninelli, D.; Revilla, M.
- Device use in web surveys: The effect of differential incentives; 2016; Mavletova, A. M.; Couper, M. P.
- Device Effects - How different screen sizes affect answers in online surveys; 2016; Fisher, B.; Bernet, F.
- Effects of motivating question types with graphical support in multi channel design studies; 2016; Luetters, H.; Friedrich-Freksa, M.; Vitt, S.; Goldstein, D. G.
- Analyzing Cognitive Burden of Survey Questions with Paradata: A Web Survey Experiment; 2016; Hoehne, J. K.; Schlosser, S.; Krebs, D.
- Why Do Web Surveys Take Longer on Smartphones?; 2016; Couper, M. P.; Peterson, G. J.
- Do Initial Respondents Differ From Callback Respondents? Lessons From a Mobile CATI Survey; 2016; Vicente, P.; Marques, C.
- Secondary Respondent Consent in the German Family Panel; 2016; Schmiedeberg, C.; Castiglioni, L.; Schroeder, J.
- Online Focus Group Discussion is a Valid and Feasible Mode When Investigating Sensitive Topics Among...; 2016; Wettergren, L.; Eriksson, L. E.; Nilsson, J.; Jarvaeus, A.; Lampic, C.
- A look into the challenges of mixed-mode surveys; 2016; Klausch, L. T.
- The use of online social networks as a promotional tool for self-administered internet surveys; 2016; de Rada, V. D.; Arino, L. V. C.; Blasco, M. G.
- Optimizing Self-response for the 2020 Census; 2016; Bentley, M.
- Improving Data Quality in a Web Survey of Youth and Teens; 2016; Horton, V. M.; Branson, R.; Phillips, B. T.; Fowlkes, E.
- Impact of Field Period Length and Contact Attempts on Representativeness for Web Survey; 2016; Bertoni, N.; Turakhia, C.; Magaw, R.; Ackermann, A.
- Have You Taken Your Survey Yet? Optimum Interval for Reminders in Web Panel Surveys ; 2016; Kanitkar, K. N.; Liu, D.